End-to-end Chinese speech recognition system using bidirectional long short-term memory networks and weighted finite-state transducers
YAO Yu, RYAD Chellali
Journal of Computer Applications    2018, 38 (9): 2495-2499.   DOI: 10.11772/j.issn.1001-9081.2018020402
To address the unrealistic conditional-independence assumptions made by Hidden Markov Model (HMM) based speech recognition, the sequence-modeling capability of recurrent neural networks was further studied, and an acoustic model based on Bidirectional Long Short-Term Memory (BLSTM) networks was proposed. The training criterion based on Connectionist Temporal Classification (CTC) was successfully applied to acoustic model training, and an end-to-end Chinese speech recognition system that does not rely on an HMM was built. Meanwhile, a speech decoding method based on Weighted Finite-State Transducers (WFST) was designed, effectively solving the problem that the lexicon and language model are difficult to integrate into the decoding process. The experimental results show that, compared with the traditional GMM-HMM system and the hybrid DNN-HMM system, the end-to-end system not only significantly reduces the recognition error rate but also significantly improves decoding speed, indicating that the proposed acoustic model can effectively enhance model discrimination and simplify the system structure.
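The abstract does not give implementation details, but the CTC criterion it builds on rests on a simple decoding rule: take the most likely symbol at each frame, collapse consecutive repeats, and drop the blank symbol. A minimal sketch of this greedy (best-path) CTC collapse, with label id 0 assumed to be the blank:

```python
BLANK = 0  # assumed id of the CTC blank symbol

def ctc_greedy_decode(frame_labels):
    """Collapse a per-frame label sequence the CTC way:
    merge consecutive repeats, then remove blanks."""
    out = []
    prev = None
    for sym in frame_labels:
        # emit a symbol only when it differs from the previous frame
        # and is not the blank
        if sym != prev and sym != BLANK:
            out.append(sym)
        prev = sym
    return out

# frames: blank, 'a', 'a', blank, 'b', 'b', blank  ->  ['a', 'b'] as ids
print(ctc_greedy_decode([0, 1, 1, 0, 2, 2, 0]))  # [1, 2]
```

Note that a blank between two identical labels keeps them distinct, which is how CTC represents repeated characters; the WFST decoder described in the paper replaces this greedy rule with a search over lexicon and language-model transducers.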
Reference | Related Articles | Metrics
Improved grey wolf optimizer algorithm using dynamic weighting and probabilistic disturbance strategy
CHEN Chuang, Ryad CHELLALI, XING Yin
Journal of Computer Applications    2017, 37 (12): 3493-3497.   DOI: 10.11772/j.issn.1001-9081.2017.12.3493
The basic Grey Wolf Optimizer (GWO) algorithm easily falls into local optima, which leads to low search precision. To solve this problem, an Improved GWO (IGWO) was proposed. On the one hand, the position-vector updating equation was dynamically adjusted by introducing a weighting factor derived from the coefficient vector of the GWO algorithm. On the other hand, a probabilistic disturbance strategy was adopted to increase the population diversity of the algorithm in the later stage of iteration, enhancing its ability to jump out of local optima. Simulation experiments were carried out on multiple benchmark test functions. The experimental results show that, compared with the GWO algorithm, the Hybrid GWO (HGWO) algorithm, the Gravitational Search Algorithm (GSA), and the Differential Evolution (DE) algorithm, the proposed IGWO can effectively escape local convergence and has clear advantages in search precision, algorithm stability, and convergence speed.
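The abstract names the two modifications but not their exact equations, so the weighting scheme and disturbance form below are illustrative assumptions: the three leader-guided candidate positions are combined with weights derived from the coefficient vectors, and a Gaussian disturbance is applied with some probability in the later iterations. A minimal sketch on the sphere function:

```python
import numpy as np

def igwo(obj, dim, n_wolves=20, max_iter=100, lb=-10.0, ub=10.0,
         disturb_prob=0.3, seed=0):
    """Sketch of an IGWO-style optimizer. The weight formula and the
    disturbance parameters here are assumptions for illustration, not
    the authors' exact equations."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    fitness = np.apply_along_axis(obj, 1, X)
    best_x, best_f = X[np.argmin(fitness)].copy(), float(fitness.min())

    for t in range(max_iter):
        a = 2.0 * (1 - t / max_iter)  # control parameter decays 2 -> 0
        order = np.argsort(fitness)
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]

        for i in range(n_wolves):
            candidates, weights = [], []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a           # coefficient vector A
                C = 2 * r2                   # coefficient vector C
                D = np.abs(C * leader - X[i])
                candidates.append(leader - A * D)
                # assumed dynamic weight derived from |A|: leaders reached
                # with smaller |A| (exploitation) get a larger weight
                weights.append(1.0 / (1e-9 + np.mean(np.abs(A))))
            w = np.array(weights) / np.sum(weights)
            X[i] = sum(wk * ck for wk, ck in zip(w, candidates))

            # probabilistic disturbance in the later half of the run
            if t > max_iter // 2 and rng.random() < disturb_prob:
                X[i] += rng.normal(0.0, 0.1 * (ub - lb), dim)
            X[i] = np.clip(X[i], lb, ub)

        fitness = np.apply_along_axis(obj, 1, X)
        if fitness.min() < best_f:
            best_f = float(fitness.min())
            best_x = X[np.argmin(fitness)].copy()
    return best_x, best_f

sphere = lambda x: float(np.sum(x ** 2))
x_best, f_best = igwo(sphere, dim=5)
```

Tracking the best-so-far position outside the loop matters here: the late-stage disturbance can temporarily worsen individual wolves, so the returned solution must not be overwritten by a perturbed population.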
Reference | Related Articles | Metrics